Necessary Conditions in Nonsmooth Minimization Via Lower and Upper Subgradients

Authors

  • Boris S. Mordukhovich
Abstract

The paper concerns first-order necessary optimality conditions for problems of minimizing nonsmooth functions under various constraints in infinite-dimensional spaces. Based on advanced tools of variational analysis and generalized differential calculus, we derive general results of two independent types, called lower subdifferential and upper subdifferential optimality conditions. The former involve basic/limiting subgradients of cost functions, while the latter are expressed via Fréchet/regular upper subgradients in fairly general settings. All the upper subdifferential and major lower subdifferential optimality conditions obtained in the paper are new even in finite dimensions. We give applications of the general optimality conditions to mathematical programs with equilibrium constraints, deriving new results for this important class of intrinsically nonsmooth optimization problems. Mathematics Subject Classifications (2000): 49J52, 49K27, 90C48.
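For orientation, the two kinds of conditions have the following prototypical shapes for a local minimizer x̄ of a function φ over a constraint set Ω. This is a rough sketch drawn from the standard limiting/Fréchet constructions, not a verbatim statement of the paper's theorems, which carry additional qualification and compactness assumptions:

```latex
% Prototypical shapes of the two condition types for a local minimizer
% \bar{x} of \varphi over \Omega (sketch only; the paper's actual theorems
% add qualification and sequential normal compactness assumptions).
\[
  0 \in \partial\varphi(\bar{x}) + N(\bar{x};\Omega)
  \qquad \text{(lower: basic/limiting subdifferential and normal cone),}
\]
\[
  -\hat{\partial}^{+}\varphi(\bar{x}) \subset \hat{N}(\bar{x};\Omega)
  \qquad \text{(upper: every Fr\'echet upper subgradient } x^{*}
  \text{ satisfies } -x^{*} \in \hat{N}(\bar{x};\Omega)\text{).}
\]
```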


Similar articles

Ergodic Results in Subgradient Optimization

Subgradient methods are popular tools for nonsmooth, convex minimization, especially in the context of Lagrangean relaxation; their simplicity has been a main contribution to their success. As a consequence of the nonsmoothness, it is not straightforward to monitor the progress of a subgradient method in terms of the approximate fulfillment of optimality conditions, since the subgradients used i...
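The monitoring difficulty can be made concrete: near a minimizer of a nonsmooth function, individual subgradients need not be small, but suitable averages of them are. Below is a minimal Python sketch of this effect for a piecewise-linear objective; the problem data, step sizes, and averaging weights are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

# Minimize f(x) = max_i (a_i . x + b_i) by the subgradient method and track
# the step-weighted (ergodic) average of the subgradients used. The raw
# subgradients stay bounded away from 0, but the ergodic average vanishes.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f(x):
    return np.max(A @ x + b)

def subgrad(x):
    # a valid subgradient of the max: the gradient of an active piece
    return A[np.argmax(A @ x + b)]

x = np.zeros(5)
g_wsum, w_sum = np.zeros(5), 0.0
for k in range(1, 20001):
    g = subgrad(x)
    step = 1.0 / k ** 0.75          # diminishing step size
    x = x - step * g
    g_wsum += step * g
    w_sum += step

# Note the identity sum_k step_k * g_k = x_0 - x_K: the ergodic average equals
# (x_0 - x_K) / sum_k step_k, so it must shrink as the step sums diverge.
g_erg = g_wsum / w_sum
print(f"f(x) = {f(x):.4f}")
print(f"|last subgradient| = {np.linalg.norm(subgrad(x)):.3f}")
print(f"|ergodic average|  = {np.linalg.norm(g_erg):.3f}")
```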


A derivative-free comirror algorithm for convex optimization

We consider the minimization of a nonsmooth convex function over a compact convex set subject to a nonsmooth convex constraint. We work in the setting of derivative-free optimization (DFO), assuming that the objective and constraint functions are available through a black-box that provides function values for lower-C2 representation of the functions. Our approach is based on a DFO adaptation of...
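For intuition, the comirror idea alternates between objective and constraint steps depending on (near-)feasibility of the current iterate. The Python sketch below shows that switching logic in a Euclidean setting with exact subgradients; the derivative-free adaptation described in the abstract would replace subgrad_f and subgrad_g with approximations built from black-box function values, and all problem data here is made up for illustration.

```python
import numpy as np

# Comirror-style switching: step along a subgradient of the objective when the
# constraint is nearly satisfied, otherwise along a subgradient of the constraint.
def project_box(x, lo=-1.0, hi=1.0):
    # projection onto the compact set X = [lo, hi]^n
    return np.clip(x, lo, hi)

def f(x):         return np.abs(x).sum()              # nonsmooth objective ||x||_1
def subgrad_f(x): return np.sign(x)                   # a subgradient of f
def g(x):         return np.abs(x - 0.5).max() - 0.3  # constraint g(x) <= 0
def subgrad_g(x):
    i = np.argmax(np.abs(x - 0.5))
    e = np.zeros_like(x); e[i] = np.sign(x[i] - 0.5)
    return e

x = project_box(np.full(4, -0.9))
best = None
for k in range(1, 5001):
    tol = 1.0 / np.sqrt(k)
    if g(x) <= tol:                     # "objective step"
        d = subgrad_f(x)
        if g(x) <= 0 and (best is None or f(x) < f(best)):
            best = x.copy()             # track best feasible point
    else:                               # "constraint step"
        d = subgrad_g(x)
    x = project_box(x - (1.0 / np.sqrt(k)) * d)

if best is not None:
    print(f"f(best) = {f(best):.4f}, g(best) = {g(best):.4f}")
```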


Ergodic Convergence in Subgradient Optimization

When nonsmooth, convex minimization problems are solved by subgradient optimization methods, the subgradients used will in general not accumulate to subgradients which verify the optimality of a solution obtained in the limit. It is therefore not a straightforward task to monitor the progress of a subgradient method in terms of the approximate fulfillment of optimality conditions. Further, certain ...
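The ergodic sequences in question are typically step-weighted averages of the subgradients generated along the iterates; one common form (a sketch of the standard construction, not necessarily the paper's exact one) is:

```latex
% Step-weighted ergodic average of the subgradients g^{s} \in \partial f(x^{s})
% produced by a subgradient method with step sizes \gamma_{s}:
\[
  \bar{g}^{\,k} \;=\; \frac{\sum_{s=0}^{k} \gamma_{s}\, g^{s}}
                           {\sum_{s=0}^{k} \gamma_{s}},
\]
% under suitable step-size rules, \bar{g}^{\,k} accumulates to subgradients
% that verify optimality of the limit point, unlike the raw g^{s} themselves.
```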


New Nonsmooth Equations-Based Algorithms for ℓ1-Norm Minimization and Applications

Recently, Xiao et al. proposed a nonsmooth equations-based method to solve the ℓ1-norm minimization problem (2011). The advantage of this method is its simplicity and low storage requirements. In this paper, based on a new nonsmooth equations reformulation, we investigate new nonsmooth equations-based algorithms for solving ℓ1-norm minimization problems. Under mild conditions, we show that the proposed algorit...
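A standard way to cast ℓ1-regularized least squares as a nonsmooth equation goes through the soft-thresholding (proximal) operator; whether this matches the paper's reformulation exactly cannot be determined from the abstract, so the Python sketch below should be read as the generic construction, with illustrative data.

```python
import numpy as np

# x* minimizes 0.5*||Ax - b||^2 + lam*||x||_1 iff F(x*) = 0, where
#   F(x) = x - shrink(x - tau * A.T @ (A @ x - b), tau * lam).
# Simple fixed-point iteration on this nonsmooth equation is the ISTA scheme.
def shrink(z, t):
    # soft-thresholding: the proximal operator of t*||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
b = A @ x_true
lam = 0.1
tau = 1.0 / np.linalg.norm(A, 2) ** 2   # step size below 1/L, L = ||A||_2^2

x = np.zeros(100)
for _ in range(2000):
    F = x - shrink(x - tau * A.T @ (A @ x - b), tau * lam)  # nonsmooth residual
    x = x - F                            # fixed-point (ISTA) step

print("residual ||F(x)|| =",
      np.linalg.norm(x - shrink(x - tau * A.T @ (A @ x - b), tau * lam)))
print("nonzeros recovered:", np.nonzero(np.abs(x) > 0.05)[0])
```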


Accelerated first-order methods for large-scale convex minimization

This paper discusses several (sub)gradient methods attaining the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with Hölder continuous gradients. The proposed schemes are optimal for smooth strongly convex problems with Lipschitz continuous gradients and optimal up to a logarithmic fac...
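The prototype of such optimal methods for the smooth case is Nesterov's classical accelerated gradient scheme, which attains the O(1/k²) rate versus O(1/k) for plain gradient descent. The following Python sketch uses an illustrative quadratic test problem and is not code from the cited paper.

```python
import numpy as np

# Nesterov accelerated gradient for f(x) = 0.5 x'Qx + c'x with L-Lipschitz
# gradient: a gradient step at an extrapolated point plus momentum update.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 30))
Q = A.T @ A                    # positive semidefinite, so f is convex
c = rng.standard_normal(30)
L = np.linalg.norm(Q, 2)       # Lipschitz constant of the gradient

def grad(x):
    return Q @ x + c

x = np.zeros(30)
y = x.copy()
t = 1.0
for _ in range(500):
    x_new = y - grad(y) / L                           # gradient step at y
    t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum schedule
    y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # extrapolation
    x, t = x_new, t_new

print("||grad|| after acceleration:", np.linalg.norm(grad(x)))
```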




Publication date: 2014